A Crowdsourcing Approach to Collecting 399 Tutorial Videos on Logarithms
Authors
Abstract
We investigated the feasibility of crowdsourcing full-fledged tutorial videos from ordinary people on the Web on how to solve math problems related to logarithms. This kind of approach (a form of learnersourcing [10, 12]) to efficiently collecting tutorial videos and other learning resources could be useful for realizing personalized learning-at-scale, whereby students receive specific learning resources – drawn from a large and diverse set – that are tailored to their individual and time-varying needs. Results of our study, in which we collected 399 videos from 66 unique “teachers” on Mechanical Turk, suggest that (1) approximately 100 videos – over 80% of which are mathematically fully correct – can be crowdsourced per week for $5/video; (2) the crowdsourced videos exhibit significant diversity in terms of language style, presentation media, and pedagogical approach; (3) the average learning gain (posttest minus pretest score) associated with watching the videos was statistically significantly higher than for a control video (0.105 versus 0.045); and (4) the average learning gain (0.1416) from watching the best tested crowdsourced videos was comparable to the learning gain (0.1506) from watching a popular Khan Academy video on logarithms.

INTRODUCTION & RELATED WORK

The goal of personalized learning, in which students’ learning experiences are tailored to their individual and time-varying needs, has been pursued by psychologists, computer scientists, and educational researchers for over five decades. Over the years, personalized learning systems have taken various forms: computer-aided instruction systems in the 1960s-1970s; intelligent tutoring systems in the 1980s-2000s [22, 1, 26]; web-based e-learning platforms in the 2000s-2010s [4, 9, 5]; and adaptive learning engines – as developed by companies such as Knewton, Pearson, and McGraw-Hill – from 2010 to the present.
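The learning-gain metric reported in the abstract (posttest score minus pretest score, averaged over viewers) can be sketched in a few lines of Python. The scores below are hypothetical illustrations, not data from the study:

```python
# Learning gain = posttest score - pretest score, averaged over students.
# Scores are fractions correct in [0, 1]; all sample values are made up.

def mean_learning_gain(pre, post):
    """Average per-student gain (posttest minus pretest)."""
    assert len(pre) == len(post)
    gains = [b - a for a, b in zip(pre, post)]
    return sum(gains) / len(gains)

# Hypothetical pre/post scores for two conditions.
tutorial_pre  = [0.40, 0.55, 0.30, 0.60]
tutorial_post = [0.55, 0.65, 0.40, 0.65]
control_pre   = [0.45, 0.50, 0.35, 0.60]
control_post  = [0.50, 0.52, 0.40, 0.62]

print(round(mean_learning_gain(tutorial_pre, tutorial_post), 3))  # 0.1
print(round(mean_learning_gain(control_pre, control_post), 3))    # 0.035
```

In the study itself, a significance test would additionally be applied to the two groups of per-student gains before claiming a difference.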
From an abstract perspective, the common goal of all these technologies is to provide each student at each moment in time with specific learning resources – e.g., illuminating tutorials of key concepts, edifying practice problems, helpful explanations of how to solve these problems, etc. – that can help students to learn more effectively than they could with a one-size-fits-all instructional approach. A key challenge when developing personalized learning systems is how to efficiently collect the set of learning resources used to personalize instruction. Without a sufficiently large and diverse set of resources from which to draw, personalized learning may not offer much advantage over traditional, single-path instruction. Intelligent tutoring systems in particular, for which the empirical benefits of personalized learning are arguably strongest [18, 13], can be extremely laborious to create, and a significant part of the effort that must be invested is in the creation of good explanations and hints [17]. Moreover, in order to be maximally effective, personalized learning systems should consider interactions between learners and resources: these interactions could be based on shared demographics of the learner and the teacher (e.g., role model effects [19, 16, 6]), language complexity of the resource that is tuned to the proficiency of the learner [7], affective sentiment (e.g., enthusiasm [14], humor [27]) of the resource that matches the emotional state of the learner, and more. Unfortunately, as the number of possible interactions between learners and resources increases, the problem of how to collect a large and diverse enough pool of resources becomes increasingly severe. One recently proposed and promising approach to collecting and curating large volumes of educational resources is to crowdsource data from learners themselves.
This process, sometimes known as learnersourcing, has been used, for example, to identify which parts of lecture videos are confusing [10], and to describe the key instructional steps [12] and subgoals [11] of “how-to” videos. More recently, learnersourcing has been used not only to annotate existing educational content, but also to create novel content itself. In particular, [25] explored a crowdsourcing-based strategy toward personalized learning in which learners were asked to author paragraphs of text explaining how to solve statistics problems. The explanations generated by learners were found to be comparable in both learning benefit and rated quality to explanations produced by expert instructors. In this paper, we too explore an approach to efficiently collecting a large and diverse set of learning resources that is based on crowdsourcing. However, in contrast to [25], in which short text-based explanations were gathered from learners who were already engaged in a learning task, our work is concerned with asking ordinary people from a crowdsourcing web site to take on the role of a teacher (which has been dubbed “teachersourcing” [8]) and to create novel, full-fledged, video-based explanations that provide worked examples [3] of how to solve a variety of mathematics problems that could potentially help math students to learn. In contrast to static text, multimedia videos such as whiteboard animations can help to focus students’ attention on the most salient parts of an explanation – e.g., by pointing to a specific mathematical expression with the mouse pointer while talking. Moreover, some students may find video to be more engaging than text, and there is preliminary evidence from the education literature that multimedia presentations lead to greater knowledge retention compared to static text-based presentations [21].
We note that the effort involved for the “teachers” in creating these videos is considerable – often an hour or more of total time according to self-reports by the participants in our study. It is thus unclear how many people on crowdsourcing websites such as Mechanical Turk would even respond to such a task, and even less clear how useful such crowdsourced explanations might be in terms of helping students to learn. This paper describes what we believe to be the first investigation into crowdsourcing entire tutorial videos from ordinary people on the Web. In particular, the rest of the paper investigates the following research questions:

1. How can we design a crowdsourcing task to convince ordinary people to create, for a modest amount of compensation, a novel tutorial video (not just a link to an existing video) that might realistically be used to help students learn? What is the throughput (videos/week) that we can attain, and how many of these videos are mathematically correct?

2. What kinds of qualitative diversity – e.g., presentation style, pedagogical approach, language style – do the crowdsourced videos exhibit?

3. How effective are these videos in helping students learn about the subject matter they are supposed to explain? How do they compare with a video produced by Khan Academy?

EXPERIMENT I: CROWDSOURCING VIDEOS

In this study we focused on crowdsourcing tutorial videos that explain how to simplify mathematical expressions and solve equations involving logarithms. Logarithms are well-suited for this study because many people know what they are; many other people – even those who once

[Figure: sample problems – “Basic Logarithms. Simplify: log_3 1, log_9 1, log 100, log_{1/5} 125, log_1 …”]
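The sample “Basic Logarithms” problems listed above all reduce via two identities: log_b 1 = 0 for any valid base b, and the change-of-base rule log_b x = ln x / ln b. A quick numerical check (assuming the unsubscripted log 100 denotes base 10):

```python
import math

def log_base(b, x):
    """Logarithm of x in base b, via the change-of-base identity."""
    return math.log(x) / math.log(b)

# log_b 1 = 0 for any valid base b, since b**0 = 1.
print(log_base(3, 1))      # 0.0
print(log_base(9, 1))      # 0.0

# log_10 100 = 2, since 10**2 = 100.
print(log_base(10, 100))   # ≈ 2.0

# log_{1/5} 125 = -3, since (1/5)**-3 = 5**3 = 125.
print(log_base(1/5, 125))  # ≈ -3.0
```

Note that log_1 of anything is undefined, since ln 1 = 0 makes the change-of-base quotient divide by zero – which is presumably the point of the final (truncated) problem.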
Similar resources
Learnersourcing: Improving Learning with Collective Learner Activity
Millions of learners today are watching videos on online platforms, such as Khan Academy, YouTube, Coursera, and edX, to take courses and master new skills. But existing video interfaces are not designed to support learning, with limited interactivity and lack of information about learners’ engagement and content. Making these improvements requires deep semantic information about video that eve...
Performance Tournaments with Crowdsourced Judges
A performance slam is a competition among a fixed set of performances whereby pairs of performances are judged by audience participants. When performances are recorded on electronic media, performance slams become amenable to audiences that watch online and judge asynchronously (crowdsourced). In order to better entertain the audience, we want to show the better performances (exploitation). I...
Identifying Unsafe Videos on Online Public Media using Real-time Crowdsourcing
Due to the significant growth of social networking and human activities through the web in recent years, attention to analyzing big data using real-time crowdsourcing has increased. This data may appear in the form of streaming images, audio or videos. In this paper, we address the problem of deciding the appropriateness of streaming videos in public media with the help of crowdsourcing in real...
Crowdsourcing Event Detection in YouTube Videos
Considerable efforts have been put into making video content on the Web more accessible, searchable, and navigable by research on both textual and visual analysis of the actual video content and the accompanying metadata. Nevertheless, most of the time, videos are opaque objects in websites. With Web browsers gaining more support for the HTML5 <video> element, videos are becoming first class ci...
Much Ado About Time: Exhaustive Annotation of Temporal Data
Large-scale annotated datasets allow AI systems to learn from and build upon the knowledge of the crowd. Many crowdsourcing techniques have been developed for collecting image annotations. These techniques often implicitly rely on the fact that a new input image takes a negligible amount of time to perceive. In contrast, we investigate and determine the most cost-effective way of obtaining high...
Journal title:
- CoRR
Volume: abs/1606.09610  Issue: -
Pages: -
Publication date: 2016